```r
.3 * log(.3)
e <- exp(1)
e^(-1.203973)  # = .3, since log(.3) = -1.203973
log(9)
log(9, base = 3)
```
p <- c("heads"=.7, "tails"=.3) (ENTROPY <- -sum( p * log(p) ))
The entropy of the coin is approximately `r round(ENTROPY, 2)`.
p <- c("1"=.2, "2"=.25, "3"=.25, "4"=.3) (ENTROPY <- -sum( p * log(p) ))
The entropy of the die is approximately `r round(ENTROPY, 2)`.
p <- c("1"=.33, "2"=.33, "3"=.33) (ENTROPY <- -sum( p * log(p) ))
The entropy of the die is approximately `r round(ENTROPY, 2)`.
The Akaike Information Criterion (AIC) approximates predictive accuracy. It estimates out-of-sample deviance (AKA test deviance) as the in-sample deviance plus twice the number of parameters: AIC = D_train + 2p.
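As a concrete check, here is a minimal sketch in R that recovers this formula by hand for an ordinary linear model, using the built-in `mtcars` data, and compares it with base R's `AIC()`. Note that `logLik()` counts the residual variance as a parameter, so the parameter count here is 3.

```r
# Minimal sketch: AIC = -2 * log-likelihood + 2 * number of parameters.
m <- lm(mpg ~ wt, data = mtcars)

k <- attr(logLik(m), "df")  # intercept, slope, and residual variance: k = 3
(aic_by_hand <- -2 * as.numeric(logLik(m)) + 2 * k)
AIC(m)  # matches the built-in
```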
The Deviance Information Criterion (DIC) is a Bayesian information criterion. It assumes a multivariate Gaussian posterior distribution, but accommodates informative priors. It equals the average of the posterior distribution of deviance plus the effective number of parameters: DIC = Dbar + pD, where pD = Dbar - Dhat and Dhat is the deviance computed at the posterior mean.
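A minimal sketch of that calculation, assuming a toy Gaussian model with known sigma = 1 and a hypothetical stand-in for the posterior samples (in practice these would come from MCMC):

```r
# Minimal sketch of DIC = Dbar + pD, where pD = Dbar - Dhat.
set.seed(1)
y    <- rnorm(20, mean = 0.5, sd = 1)
post <- rnorm(4000, mean = mean(y), sd = 1 / sqrt(length(y)))  # stand-in posterior samples of mu

dev  <- sapply(post, function(mu) -2 * sum(dnorm(y, mu, 1, log = TRUE)))
Dbar <- mean(dev)                                      # average posterior deviance
Dhat <- -2 * sum(dnorm(y, mean(post), 1, log = TRUE))  # deviance at the posterior mean
pD   <- Dbar - Dhat                                    # effective number of parameters
(DIC <- Dbar + pD)
```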
The Widely Applicable Information Criterion (WAIC) estimates out-of-sample deviance (AKA test deviance) without assuming a multivariate Gaussian posterior distribution, because it is computed pointwise: it handles the uncertainty at each particular observation separately. It equals -2 * (lppd - pWAIC), where lppd is the log pointwise predictive density and pWAIC is the effective number of parameters.
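A minimal sketch of the pointwise calculation, under the same hypothetical setup as the DIC sketch above (Gaussian model, known sigma = 1, stand-in posterior samples):

```r
# Minimal sketch of WAIC = -2 * (lppd - pWAIC), computed pointwise.
set.seed(1)
y    <- rnorm(20, mean = 0.5, sd = 1)
post <- rnorm(4000, mean = mean(y), sd = 1 / sqrt(length(y)))  # stand-in posterior samples of mu

ll <- sapply(y, function(yi) dnorm(yi, post, 1, log = TRUE))  # S x N log-likelihood matrix

lppd  <- sum(log(colMeans(exp(ll))))  # log pointwise predictive density
pWAIC <- sum(apply(ll, 2, var))       # per-observation variance = effective n. of parameters
(WAIC <- -2 * (lppd - pWAIC))
```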
The WAIC is the most general of the three, because it assumes neither a multivariate Gaussian posterior nor uninformative priors.